Residual and Attentional Architectures for Vector-Symbols
Vector-symbolic architectures (VSAs) provide highly flexible methods for
computing that carry unique advantages. Concepts in VSAs are represented by
'symbols': long vectors of values that exploit properties of high-dimensional
spaces to represent and manipulate information. In this work, we combine the
efficiency of the operations provided by the Fourier Holographic Reduced
Representation (FHRR) VSA with the power of deep networks to construct novel
VSA-based residual and attention-based neural
network architectures. Using an attentional FHRR architecture, we demonstrate
that the same network architecture can address problems from different domains
(image classification and molecular toxicity prediction) by encoding different
information into the network's inputs, similar to the Perceiver model. This
demonstrates a novel application of VSAs and a potential path to implementing
state-of-the-art neural models on neuromorphic hardware.

Comment: 6 pages, 7 figures, 1 table
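As background, the FHRR operations the abstract relies on can be sketched in a few lines. In FHRR, a symbol is a complex vector of unit-magnitude entries with random phases; binding is elementwise complex multiplication and unbinding multiplies by the complex conjugate. This is a minimal illustration of the general FHRR scheme, not code from the paper; all names and the dimensionality are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # symbol dimensionality (illustrative choice)

def random_symbol(d=D):
    # FHRR symbol: unit-magnitude complex vector with uniformly random phases
    return np.exp(1j * rng.uniform(-np.pi, np.pi, d))

def bind(a, b):
    # Binding is elementwise complex multiplication (phases add)
    return a * b

def unbind(c, a):
    # The conjugate inverts a unit-magnitude symbol, undoing the binding
    return c * np.conj(a)

def similarity(a, b):
    # Normalized real inner product: ~1 for matching symbols, ~0 for random pairs
    return np.real(np.vdot(a, b)) / len(a)

red, apple = random_symbol(), random_symbol()
pair = bind(red, apple)            # bound representation of (red, apple)
recovered = unbind(pair, red)      # query the pair with one of its factors

print(similarity(recovered, apple))  # close to 1: apple is recovered
print(similarity(pair, apple))       # close to 0: the binding is dissimilar to its parts
```

Because every entry has unit magnitude, unbinding is an exact inverse here; in noisy settings the recovered vector is instead cleaned up by comparing against a codebook of known symbols.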